Getting Started - Evaluating the Library

Requesting the Evaluation Binary
To request an evaluation binary for a specific architecture, please get in touch with us and provide the following information:
- Platform: NVIDIA Jetson, x86 PC, Qualcomm RBx, NXP i.MX, Xilinx ZYNQ, etc.
- Details of the Platform:
- NVIDIA Jetson:
- Nano, TX2
- Xavier: NX, AGX
- Orin: Nano, NX, AGX
- x86 PC:
- Processor: Intel, AMD (with all the details, e.g., Intel i7-1065G7)
- Graphics Card: NVIDIA, Intel, AMD (with all the details, e.g., NVIDIA T4)
- Qualcomm:
- RBx: RB5, RB6
- NXP:
- i.MX 6
- i.MX 8
- i.MX 95
- Xilinx:
- ZYNQ US+
- ZYNQ 7000
- Kria K26, K24
- Operating System version, including:
- Kernel version. Use:
uname -a
- Distribution version (e.g., Ubuntu 20.04, Yocto Scarthgap)
- If you are on Jetson:
- Jetpack version
- L4T version
Although other platforms are not listed above, any platform with support for OpenCL 2.x, OpenCV, or NVIDIA VPI is also supported.
Moreover, please provide a brief description of your use case.
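Most of the details requested above can be gathered with a few standard commands. This is a convenience sketch; the NVIDIA-specific file is absent on non-Jetson platforms:

```shell
# Kernel version
uname -a
# Distribution version (works on most modern Linux distributions)
head -n 2 /etc/os-release 2>/dev/null
# NVIDIA Jetson only: L4T release (the Jetpack version can be derived from it)
cat /etc/nv_tegra_release 2>/dev/null || true
```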
RidgeRun will then provide you with the binaries to evaluate the library; these binaries have some limitations, described below.
Features of the Evaluation
To help you test the RidgeRun Video Stabilizer Library, RidgeRun can provide an evaluation version of the library.
The following table summarizes the features available in the professional and evaluation versions of the RidgeRun Video Stabilizer Library (Y: available, N: not available, P: partial support).
Feature | Professional | Evaluation |
---|---|---|
C++ headers | Y | Y |
Examples | Y | Y |
GStreamer plugin | P (1) | P (1) |
Unlimited Processing Time | Y | N (2) |
Source Code | Y | N |
(1) GStreamer support is in progress. We are currently working to have it ready by the third quarter of 2024.
(2) The evaluation version will limit the processing to a maximum of 9000 executions for each library module. You can also ask for a time-limited evaluation with unlimited features.
Evaluating the RidgeRun Video Stabilizer
Please find the details about the dependencies, installation, and testing below.
Installing dependencies
The RidgeRun Video Stabilization Library has both mandatory and optional dependencies. The mandatory dependencies are:
- A C++17-compatible compiler (e.g., GCC 9.x)
- Python3 and pip3
- Ninja and pkg-config
- Meson
At least one of the following dependencies must be present on the system:
- OpenCV >= 4.2: For CPU image correction
- OpenCL >= 2.2: For HW-accelerated image correction on non-NVIDIA platforms
- CUDA >= 10.2: For HW-accelerated image correction on NVIDIA (planned support for Q3 2024)
The following dependencies are fully optional:
- Doxygen: for documentation
- Pre-commit: for developer-mode
- Boost (system, filesystem, iostreams): for plotting
- GNUPlot iostream: for plotting
Make sure the dependencies are installed; the Dependencies section explains how to install them.
The RidgeRun Video Stabilization Library is not supported on Windows.
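On Debian/Ubuntu-based systems, the mandatory dependencies can typically be installed as sketched below. The package names are assumptions for apt-based distributions; Yocto-based systems provide these through their own recipes:

```shell
# Compiler, Python, and build tooling
sudo apt-get update
sudo apt-get install -y build-essential python3 python3-pip ninja-build pkg-config
# Meson is available via pip3
pip3 install --user meson
# Pick at least one correction backend, e.g. OpenCV for CPU image correction
sudo apt-get install -y libopencv-dev
```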
Installing and testing the RidgeRun Video Stabilizer
RidgeRun will provide you with a tarball with the contents of the evaluation version of the RidgeRun Video Stabilizer library.
To test the binary for the evaluation version of RidgeRun Video Stabilizer, please:
- Make sure the dependencies are installed
- Decompress the provided tarball and enter the directory
- Run the following command:
./install.bash
- If evaluating with a camera, please install the required plugins. These are usually already provided with the evaluation package.
- GstCameraDriverMeta: required for all platforms.
- nvarguscamerasrc: required for NVIDIA Jetson when using the ISP for debayering the camera images.
- qtiqmmfsrc and qtivtransform: required for Qualcomm RBx.
- v4l2src: required for all platforms except when using nvarguscamerasrc or Qualcomm platforms.
For more information about building the required elements see this section.
- Now set the environment with the elements:
source ./set-env.bash
With this ready, you can start using the evaluation package.
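As a quick sanity check after sourcing the script, you can confirm that the environment was updated. The variable name below is an assumption; consult the contents of set-env.bash in your package for the exact variables it exports:

```shell
# Show the GStreamer plugin search path, if the script exported one
echo "GST_PLUGIN_PATH=${GST_PLUGIN_PATH:-<unset>}"
# List the evaluation install directory, if present
ls /opt/rvs-eval 2>/dev/null || echo "/opt/rvs-eval not found"
```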
Checking the Installation and Environment
Note: Skip this section if using RB5
1. Check the sensor
- BMI160 sensor:
If you have this sensor, connect it following this guide. Then run:
# Sensor access: it may change from one devkit to another
SENSOR=/dev/i2c-8
bmi160-example ${SENSOR}
- ICM42688 sensor:
If you have this sensor, connect it following this guide. Then run:
# Sensor access: it may change from one devkit to another
SENSOR=icm42688
icm42600-example ${SENSOR}
- External sensor:
cd assets/external-imu-example
make
cd -
external-imu-example -d assets/external-imu-example/libexternalimu.so
It should print some readings. Note that assets/external-imu-example is where you can integrate your own sensor implementation.
This test does not use a real IMU; it exercises the wrapper for your own integration. If you have a sensor other than the BMI160, you will need to go through the integration process. For more information, review this section.
2. Check the CLI tools and Gst elements
For this step just run the following script:
./rvs_env_checker.sh
This script checks on:
- rvs-undistort: Checks that it generates undistorted images. This validates the hardware-accelerated components.
- rvs-camera-calibration and rvs-imu-orientation-calibration: Checks that they are available. These tools are required for the calibration (next section).
- Gst elements: Checks that they are in the correct path /opt/rvs-eval.
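If the checker script is unavailable on your platform, the same checks can be approximated manually. This is a minimal sketch based on the list above; the tool names and the /opt/rvs-eval path are taken from this page:

```shell
# Verify that the CLI tools are reachable on the PATH
for tool in rvs-undistort rvs-camera-calibration rvs-imu-orientation-calibration; do
    command -v "$tool" >/dev/null && echo "found: $tool" || echo "missing: $tool"
done
# Verify that the GStreamer elements are installed in the expected path
[ -d /opt/rvs-eval ] && echo "found: /opt/rvs-eval" || echo "missing: /opt/rvs-eval"
```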
Test the tool
Now there are two ways of testing the tool:
Test without hardware (proof of concept)
If neither a sensor nor a camera is available in the setup, RidgeRun provides an example to test the concept and estimate the performance achievable in live stabilization.
1. Check if it is available:
rvs-complete-concept
2. Run according to the guide:
Follow the guide about the Library Integration for IMU Example Application.
Testing with hardware (live stabilization)
RidgeRun also provides all the necessary binaries for testing the stabilization with V4L2 and Argus-compatible cameras. If a camera and an IMU are available in the setup, it is possible to evaluate live stabilization using GStreamer. Please follow the steps below to proceed.
Integrating your IMU sensor
At the moment, the RidgeRun Video Stabilization Library only has built-in support for the BMI160 IMU sensor. If you want to connect this specific sensor, follow this guide.
If necessary, it is possible to evaluate other sensors. Please, follow the API Reference / Adding New Sensors guide for more information, particularly the Adding External Sensors section.
Calibrating the Sensors
1. Camera calibration:
For the camera calibration, please follow this guide: Video Undistortion. This calibration will output a matrix necessary for the test with a GStreamer pipeline or your application with this class.
2. IMU calibration:
For the IMU calibration, please follow this guide: Preparing the IMU Measurements. This calibration will output the axes necessary for the test with a GStreamer pipeline or your application with this class.
Live Testing
For this part, it is necessary to have an IMU sensor integrated with the camera, either by using the BMI160 or your own external sensor. You can test the tool with GStreamer or integrate it into your application.
- GStreamer usage:
Please use the following guide: Example Pipelines.
- API usage:
For API usage, you can use the examples available in our Examples Guidebook.
Cleaning up
Once you have finished the evaluation, you can uninstall the tool by running:
./uninstall.bash
More about the Library
If you want to know more about the library, check out the API Documentation and GStreamer sections.